Improved Back Propagation Algorithm to Avoid Local Minima in Multiplicative Neuron Model
Abstract
The back propagation algorithm computes the weight updates of artificial neural networks, and a common approach is a training rule consisting of a learning rate and a momentum factor. The major drawbacks of this learning algorithm are local minima and slow convergence. The addition of an extra term, called a proportional factor, speeds up the convergence of the back propagation algorithm. We apply three term back propagation to multiplicative neural network learning. The algorithm is tested on the XOR and parity problems and compared with the standard back propagation training algorithm.
Keywords—Three term back propagation, multiplicative neural network, proportional factor, local minima.
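A minimal sketch of the two ideas the abstract names: the multiplicative neuron's forward pass and the three-term weight update. The function names, the toy objective, and the values of the learning rate, momentum factor, and proportional factor are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def mult_neuron(x, w, b):
    """Multiplicative neuron model (sketch): the net input is a
    product of per-input linear terms rather than a weighted sum."""
    u = np.prod(w * x + b)           # u = prod_i (w_i * x_i + b_i)
    return 1.0 / (1.0 + np.exp(-u))  # sigmoid activation

def three_term_step(w, grad, prev_dw, error,
                    alpha=0.1, beta=0.5, gamma=0.01):
    """Three-term back propagation update:
        dw(t) = -alpha * dE/dw + beta * dw(t-1) + gamma * e(t)
    alpha: learning rate, beta: momentum factor,
    gamma: proportional factor (the added third term)."""
    dw = -alpha * grad + beta * prev_dw + gamma * error
    return w + dw, dw

# Toy demonstration on E(w) = 0.5 * w**2, where dE/dw = w and the
# output error is e = -w; the update should drive w toward zero.
w, dw = 2.0, 0.0
for _ in range(50):
    w, dw = three_term_step(w, grad=w, prev_dw=dw, error=-w)
```

On this toy objective the proportional term acts alongside the gradient, so the weight decays toward the minimum; in the paper the same third term is computed from the network's output error during training.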
Similar Articles
Prediction of rotary machinery degradation status based on vibration data using back propagation (BP) neural network
Rotary machinery has to be maintained and repaired once or several times before it fails completely, in order to keep it in good working condition. To determine the maintenance interval of rotary machinery, machine condition should be predicted by diagnostic and predictive methods. This study was conducted to establish the relationship between machine performance and machine vi...
IIR System Identification Using Improved Harmony Search Algorithm with Chaos
Due to the fact that the error surface of adaptive infinite impulse response (IIR) systems is generally nonlinear and multimodal, the conventional derivative based techniques fail when used in adaptive identification of such systems. In this case, global optimization techniques are required in order to avoid the local minima. Harmony search (HS), a musical inspired metaheuristic, is a recently ...
Gradient Learning in Networks of Smoothly Spiking Neurons
A slightly simplified version of the Spike Response Model SRM0 of a spiking neuron is tailored to gradient learning. In particular, the evolution of spike trains along the weight and delay parameter trajectories is made perfectly smooth. For this model a back-propagation-like learning rule is derived which propagates the error also along the time axis. This approach overcomes the difficulties w...
Initial Classification Through Back Propagation In a Neural Network Following Optimization Through GA to Evaluate the Fitness of an Algorithm
An Artificial Neural Network classifier is a nonparametric classifier. It does not need any a priori knowledge regarding the statistical distribution of the classes in a given selected data source. A neural network can be trained to learn the classification criteria in a generalized manner that allows successful classification of newly arrived inputs not used during training. T...
A New Back-Propagation Neural Network Optimized with Cuckoo Search Algorithm
The Back-propagation Neural Network (BPNN) algorithm is one of the most widely used techniques for training feed-forward neural networks. The traditional BP algorithm has some drawbacks, such as getting stuck easily in local minima and slow convergence. Nature-inspired meta-heuristic algorithms provide derivative-free solutions for optimizing complex problems. This paper pr...